Hierarchical Dirichlet Processes
Authors
Abstract
Figure 1: Bayesian mixture model.

For the Bayesian mixture model shown in Figure 1, as k → ∞ we have G = ∑_{c=1}^{∞} π_c δ_{φ_c}, where the φ_c are i.i.d. samples from G0, and the random sequence {π_c}_{c=1}^{∞}, which sums to one, is constructed by the "stick-breaking" process [3]. Suppose there is a stick of length 1. Let β_c ∼ Beta(1, α) for c = 1, 2, 3, . . . , and regard each β_c as the fraction we break off from the remainder of the stick at step c. Then π_c is the length broken off at step c: π_1 = β_1, π_2 = (1 − β_1)β_2, and in general π_c = β_c ∏_{l=1}^{c−1} (1 − β_l).
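The stick-breaking recursion above can be sketched in a few lines of NumPy. This is a minimal truncated simulation (the function name, the truncation level, and the concentration value α = 2 are illustrative assumptions, not part of the paper):

```python
import numpy as np

def stick_breaking(alpha, num_atoms, rng=None):
    """Sample truncated stick-breaking weights pi_1, ..., pi_num_atoms."""
    rng = np.random.default_rng(rng)
    betas = rng.beta(1.0, alpha, size=num_atoms)
    # remainder of the stick before each break: prod_{l<c} (1 - beta_l)
    remainder = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    return betas * remainder  # pi_c = beta_c * remainder_c

pi = stick_breaking(alpha=2.0, num_atoms=2000)
```

Because the simulation is truncated at a finite number of atoms, the weights sum to slightly less than one; the missing mass is the unbroken remainder of the stick.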
Similar resources
Time-Varying Topic Models using Dependent Dirichlet Processes
We lay the ground for extending Dirichlet Process based clustering and factor models to explicitly include variability as a function of time (or other known covariates) by integrating a Dependent Dirichlet Process into existing hierarchical topic models. Nathan Srebro, Sam Roweis, Dept. of Computer Science, University of Toronto, C...
Online Data Clustering Using Variational Learning of a Hierarchical Dirichlet Process Mixture of Dirichlet Distributions
This paper proposes an online clustering approach based on both hierarchical Dirichlet processes and Dirichlet distributions. The deployment of hierarchical Dirichlet processes resolves difficulties related to model selection, thanks to their nonparametric nature, when the number of mixture components is unknown. The consideration of the Dirichlet distribution is justified b...
Hierarchical Dirichlet Processes
We consider problems involving groups of data, where each observation within a group is a draw from a mixture model, and where it is desirable to share mixture components between groups. We assume that the number of mixture components is unknown a priori and is to be inferred from the data. In this setting it is natural to consider sets of Dirichlet processes, one for each group, where the well...
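The construction described here, one Dirichlet process per group, all sharing atoms through a common discrete base measure G0, can be sketched with truncated stick-breaking. A minimal simulation, assuming a Gaussian base measure H and illustrative concentration values (none of which are specified by the abstract):

```python
import numpy as np

def stick_weights(alpha, k, rng):
    # truncated stick-breaking: w_c = beta_c * prod_{l<c} (1 - beta_l)
    b = rng.beta(1.0, alpha, size=k)
    w = b * np.concatenate(([1.0], np.cumprod(1.0 - b)[:-1]))
    return w / w.sum()  # renormalise the truncated weights

rng = np.random.default_rng(0)
K, gamma, alpha0 = 50, 1.0, 1.0      # truncation level and concentrations (assumed)

phi = rng.normal(size=K)             # atoms of the global measure, drawn from H = N(0, 1)
beta = stick_weights(gamma, K, rng)  # global draw: G0 = sum_k beta_k * delta_{phi_k}

groups = []
for j in range(3):                   # one DP per group, with base measure G0
    pi = stick_weights(alpha0, K, rng)
    # each group's atoms are resampled from the discrete G0, so they are
    # necessarily shared across groups
    atoms = phi[rng.choice(K, size=K, p=beta)]
    groups.append((atoms, pi))
```

Because G0 is itself discrete, every group-level draw places its mass on a subset of the global atoms phi, which is exactly the sharing of mixture components between groups that the abstract describes.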
Hierarchical Double Dirichlet Process Mixture of Gaussian Processes
We consider an infinite mixture model of Gaussian processes that share mixture components between nonlocal clusters in data. Meeds and Osindero (2006) use a single Dirichlet process prior to specify a mixture of Gaussian processes using an infinite number of experts. In this paper, we extend this approach to allow for experts to be shared non-locally across the input domain. This is accomplishe...
Borrowing strength in hierarchical Bayes: Posterior concentration of the Dirichlet base measure
This paper studies posterior concentration behavior of the base probability measure of a Dirichlet measure, given observations associated with the sampled Dirichlet processes, as the number of observations tends to infinity. The base measure itself is endowed with another Dirichlet prior, a construction known as the hierarchical Dirichlet processes (Teh et al. [J. Amer. Statist. Assoc. 101 (200...
Borrowing strength in hierarchical Bayes: convergence of the Dirichlet base measure (arXiv:1301.0802v3 [math.ST], 29 Jan 2015)
This paper studies posterior concentration behavior of the base probability measure of a Dirichlet measure, given observations associated with the sampled Dirichlet processes, as the number of observations tends to infinity. The base measure itself is endowed with another Dirichlet prior, a construction known as the hierarchical Dirichlet processes [Teh et al., 2006]. Convergence rates are esta...